Search Results — Page 1 of 1
Search for: All records
Total Resources: 2
Machine learning today involves massive distributed computations running on cloud servers, which are highly susceptible to slowdown or straggling. Recent work has demonstrated the effectiveness of erasure codes in mitigating such slowdown for linear computations: redundant computations are added so that the entire result can be recovered as long as a subset of nodes finish their assigned tasks. However, most machine learning algorithms involve non-linear computations that cannot be directly handled by these coded computing approaches. In this work, we propose a coded computing strategy for mitigating the effect of stragglers on non-linear distributed computations. Our strategy relies on the observation that many expensive non-linear functions can be decomposed into sums of cheap non-linear functions. We show that erasure codes, specifically rateless codes, can be used to generate and compute random linear combinations of these functions at the nodes, such that the original function can be computed as long as a subset of nodes return their computations. Simulations and experiments on AWS Lambda demonstrate the superiority of our approach over various uncoded baselines.
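The core idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a toy decomposition of an expensive function `f` into `k` cheap component functions (the choice of `sin`, `cos`, and a square is purely illustrative), assigns each of `n > k` workers a random linear combination of the component values, and decodes from the first `k` responders by solving a linear system. With i.i.d. Gaussian coefficients, any `k` rows of the coding matrix are full rank with probability 1, so the straggling workers are never needed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical decomposition: the expensive function f is a sum of
# k cheap non-linear component functions g_j (illustrative choice).
components = [np.sin, np.cos, lambda x: x ** 2]
k = len(components)

def f(x):
    return sum(g(x) for g in components)

# Encoding: each of n workers is assigned a random coefficient vector
# and returns one random linear combination of the component values.
n = 6  # n > k provides straggler redundancy
C = rng.standard_normal((n, k))

def worker(i, x):
    return C[i] @ np.array([g(x) for g in components])

x = 1.5
# Suppose only workers 1, 3, and 4 respond; the rest straggle.
responders = [1, 3, 4]
y = np.array([worker(i, x) for i in responders])

# Decoding: any k responses form a full-rank k x k system (with
# probability 1 for random real coefficients); solve for the
# component values and sum them to recover f(x).
g_vals = np.linalg.solve(C[responders], y)
estimate = g_vals.sum()
assert np.isclose(estimate, f(x))
```

The rateless-code view of this construction lets the master keep generating fresh coded combinations and stop as soon as any `k` results arrive, rather than fixing the redundancy in advance.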
Keifert, Danielle; Hall, Rogers; Enyedy, Noel; Vogelstein, Lauren; Pierson, Ashlyn; Ehrenfeld, Nadav; Marshall, Samantha; McGugan, Katherine Schneeberger; Marin, Ananda; Faulstich, Elisa Noemí; et al. In M.S. Gresalfi (Ed.), The Interdisciplinarity of the Learning Sciences, 14th International Conference of the Learning Sciences (ICLS) 2020.

Charles Goodwin's legacy includes a multitude of analytical tools for examining meaning making in interaction. We focus on Goodwin's substrate — "the local, public configuration of action and semiotic resources" available in interaction used to create shared meanings (Goodwin, 2018, p. 32) — gathering early career scholars to explore how research designs adapt substrate as an analytical tool for education research in diverse settings. This structured poster session examines how substrate can be used to capture a complex web of learning phenomena and support important analytical shifts, including representing learning processes, privileging members' phenomena to address issues of equity, and understanding shifting power relations through multi-layered and multi-scaled analyses.